1 - 20 of 32
1.
Sci Rep ; 13(1): 19445, 2023 Nov 09.
Article En | MEDLINE | ID: mdl-37945627

This work developed and simulated a planar complementary Archimedes-based metamaterial absorber for application in refractive-index sensing. Unlike designs that employ multiple layers or numerous resonators within a single unit cell, the proposed absorber adopts a more streamlined approach: it consists of three layers, with an FR4 dielectric substrate sandwiched between two copper layers. Notably, the absorption characteristics of this design are polarization-dependent, a consequence of the asymmetrical resonance behavior observed in the x and y directions. The absorber exhibits strong absorption at several resonance frequencies: 98.5% at f1 = 8.49 GHz, 77.1% at f2 = 8.88 GHz, 88.7% at f3 = 9.3 GHz, 98.2% at f4 = 9.87 GHz, 99.7% at f5 = 10.65 GHz, 83.4% at f6 = 11.58 GHz, and 99.9% at f7 = 12.24 GHz. The article further explored the refractive-index sensing capabilities of the structure by introducing a 1 mm analyte layer on top of the patch structure. Refractive-index sensing analysis showed that this absorber-based sensor yields a high quality factor of 84.5, highlighting its sensitivity and precision. Examination of the surface current distributions provided a deeper understanding of the underlying physical mechanisms, and the behavior of the absorber was investigated under varying polarization and incident angle conditions, from zero to sixty degrees. This thorough characterization establishes the absorber as a promising candidate for microwave sensing applications.
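The quality factor cited above relates a resonance frequency to its half-power bandwidth. As a rough illustration, assuming the common definition Q = f0/Δf and (hypothetically) attributing the reported Q of 84.5 to the f5 = 10.65 GHz resonance, the implied bandwidth can be back-calculated:

```python
# Quality factor of a resonance: Q = f0 / delta_f, where delta_f is the
# half-power (FWHM) bandwidth. The pairing of Q = 84.5 with the 10.65 GHz
# resonance below is an illustrative assumption, not a value from the study.

def quality_factor(f0_ghz: float, fwhm_ghz: float) -> float:
    """Return the dimensionless quality factor of a resonance."""
    return f0_ghz / fwhm_ghz

# If Q = 84.5 were measured at the 10.65 GHz resonance, the implied
# half-power bandwidth would be:
implied_bandwidth = 10.65 / 84.5  # ~0.126 GHz

print(round(implied_bandwidth, 3))                          # 0.126
print(round(quality_factor(10.65, implied_bandwidth), 1))   # 84.5
```

A higher Q (narrower bandwidth) generally means a sharper resonance and finer refractive-index resolution, which is why the reported value is taken as a figure of merit for sensitivity.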

2.
JMIR Public Health Surveill ; 8(9): e35973, 2022 09 27.
Article En | MEDLINE | ID: mdl-35544440

BACKGROUND: Disease surveillance is a critical function of public health, provides essential information about the disease burden and the clinical and epidemiologic parameters of disease, and is an important element of effective and timely case and contact tracing. The COVID-19 pandemic demonstrates the essential role of disease surveillance in preserving public health. In theory, the standard data formats and exchange methods provided by electronic health record (EHR) meaningful use should enable rapid health care data exchange in the setting of disruptive health care events, such as a pandemic. In reality, access to data remains challenging and, even if available, often lacks conformity to regulated standards. OBJECTIVE: We sought to use regulated interoperability standards already in production to generate awareness of regional bed capacity and enhance the capture of epidemiological risk factors and clinical variables among patients tested for SARS-CoV-2. We described the technical and operational components, governance model, and timelines required to implement the public health order that mandated electronic reporting of data from EHRs among hospitals in the Chicago jurisdiction. We also evaluated the data sources, infrastructure requirements, and the completeness of data supplied to the platform and the capacity to link these sources. METHODS: Following a public health order mandating data submission by all acute care hospitals in Chicago, we developed the technical infrastructure to combine multiple data feeds from those EHR systems: a regional data hub to enhance public health surveillance. A cloud-based environment was created that received electronic laboratory reporting (ELR), consolidated clinical document architecture (CCDA), and bed capacity data feeds from sites. Data governance was planned from project initiation to aid in consensus and principles for data use. We measured the completeness of each feed and the match rate between feeds.
RESULTS: Data from 88,906 persons from CCDA records among 14 facilities and 408,741 persons from ELR records among 88 facilities were submitted. Most (n=448,380, 90.1%) records could be matched between CCDA and ELR feeds. Data fields absent from ELR feeds included travel histories, clinical symptoms, and comorbidities. Less than 5% of CCDA data fields were empty. Merging CCDA with ELR data improved race, ethnicity, comorbidity, and hospitalization information data availability. CONCLUSIONS: We described the development of a citywide public health data hub for the surveillance of SARS-CoV-2 infection. We were able to assess the completeness of existing ELR feeds, augment those feeds with CCDA documents, establish secure transfer methods for data exchange, develop a cloud-based architecture to enable secure data storage and analytics, and produce dashboards for monitoring of capacity and the disease burden. We consider this public health and clinical data registry as an informative example of the power of common standards across EHRs and a potential template for future use of standards to improve public health surveillance.
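The reported 90.1% match rate can be reproduced from the abstract's own counts, under the assumption that the denominator is the combined number of person records across the two feeds:

```python
# Reproduce the reported CCDA-ELR match rate from the abstract's counts.
# Assumption: the 90.1% denominator is the combined person-record count of
# the two feeds (88,906 CCDA + 408,741 ELR).

ccda_persons = 88_906
elr_persons = 408_741
matched = 448_380

total = ccda_persons + elr_persons   # 497,647 person records
match_rate = matched / total

print(f"{match_rate:.1%}")  # 90.1%
```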


COVID-19 , Health Information Exchange , COVID-19/epidemiology , Humans , Pandemics/prevention & control , Public Health , SARS-CoV-2
3.
Crit Care Explor ; 3(7): e0484, 2021 Jul.
Article En | MEDLINE | ID: mdl-34278314

Extracorporeal membrane oxygenator support is a powerful clinical tool that is currently enjoying a resurgence in popularity. Wider use of extracorporeal membrane oxygenator support is limited by its significant risk profile and extreme consumption of resources. This study examines the role of markers of liver dysfunction in predicting outcomes of adult patients requiring extracorporeal membrane oxygenator support. DESIGN: Retrospective review. SETTING: Large extracorporeal membrane oxygenator center, Chicago, IL. PATIENTS: This study reports a single-institution experience examining all adult patients for whom extracorporeal membrane oxygenator support was used over an 8-year period. Data were collected regarding patient demographics, details of extracorporeal membrane oxygenator support provided, laboratory data, and outcomes. Trends in liver function were examined for their ability to predict survival. INTERVENTION: Extracorporeal membrane oxygenator support, critical care. MEASUREMENTS AND MAIN RESULTS: Mean age was 50 years (range, 19-82 yr). There were 86 male patients (56.6%) and 66 female patients (43.4%). Indications for initiation of extracorporeal membrane oxygenator support were cardiac in 76 patients (50.0%), respiratory in 48 patients (31.6%), extracorporeal cardiopulmonary resuscitation in 21 patients (13.3%), and combined cardiac/respiratory in 7 patients (4.6%). Mean duration of extracorporeal membrane oxygenator support was 17 days (range, 1-223 d); the median was 8 days (interquartile range, 4-17 d). Overall in-hospital mortality was 56% (86/152). Forty-five percent of adult patients (68/152) surpassed at least one of the following established liver dysfunction thresholds: total bilirubin greater than 15 mg/dL, aspartate aminotransferase greater than 20× the upper limit of normal, and alanine aminotransferase greater than 20× the upper limit of normal.
The multivariable logistic analysis yielded three significant findings associated with in-hospital mortality: highest total bilirubin greater than 15 mg/dL (adjusted odds ratio = 4.40; 95% CI, 1.19-21.87; p = 0.04), age (adjusted odds ratio = 1.03; 95% CI, 1.00-1.05; p = 0.04), and highest lactate (adjusted odds ratio = 1.15; 95% CI, 1.06-1.26; p = 0.002). CONCLUSIONS: Increases in age, highest total bilirubin, and highest lactate all correlated with in-hospital mortality in multivariable analysis of patients requiring extracorporeal membrane oxygenator support.
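Adjusted odds ratios like those above are multiplicative per unit of the predictor, so they compound exponentially across larger contrasts. A brief sketch using the reported lactate OR of 1.15 per unit (the 5-unit contrast is an illustrative assumption, not a comparison made in the study):

```python
import math

# An odds ratio per one unit of a predictor compounds multiplicatively:
# OR over k units = OR_per_unit ** k, equivalently exp(k * beta) where
# beta = ln(OR) is the log-odds coefficient.

def scaled_or(or_per_unit: float, units: float) -> float:
    """Odds ratio for a `units`-sized contrast in the predictor."""
    return or_per_unit ** units

# Illustration with the reported lactate OR of 1.15 per unit:
print(round(scaled_or(1.15, 5), 2))   # 2.01: roughly double the odds over a 5-unit difference

# The same result via the log-odds coefficient:
beta = math.log(1.15)
print(round(math.exp(5 * beta), 2))   # 2.01
```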

4.
Annu Rev Nutr ; 41: 203-222, 2021 10 11.
Article En | MEDLINE | ID: mdl-34143642

Proper timing of critical care nutrition has long been a matter of controversy. Critical illness waxes and wanes in stages, creating a dynamic flux in energy needs that we have only begun to examine. Furthermore, response to nutrition support likely differs greatly at the level of the individual patient in regard to genetic status, disease stage, comorbidities, and more. We review the observational and randomized literature concerning timing in nutrition support, discuss mechanisms of harm in feeding critically ill patients, and highlight the role of precision nutrition for moving the literature beyond the realm of blunt population averages into one that accounts for the patient-specific complexities of critical illness and host genetics.


Enteral Nutrition , Parenteral Nutrition , Critical Care , Critical Illness/therapy , Humans , Nutritional Support
5.
JPEN J Parenter Enteral Nutr ; 45(2): 287-294, 2021 02.
Article En | MEDLINE | ID: mdl-32885455

BACKGROUND: Converting nutrition support to energy results in mitochondrial free radical production, possibly increasing oxidative stress. Highly prevalent single nucleotide variants (SNVs) exist for the genes encoding the antioxidant enzymes responsible for the detoxification of reactive oxygen species. Our objective was to explore the interaction between nutrition support and genetic SNVs for two antioxidant proteins (the rs4880 SNV for manganese superoxide dismutase and the rs1050450 SNV for glutathione peroxidase 1) on oxidative stress and, secondarily, on intensive care unit (ICU) mortality. METHODS: We performed a post hoc analysis of 34 mechanically ventilated sepsis patients from a randomized controlled feeding trial. Participants were dichotomized into those who carried both the rs4880 and rs1050450 at-risk alleles (Risk Group) versus all others (Nonrisk Group). We explored the interaction between genotype and percent time spent in the upper median of energy exposure on oxidative stress and ICU mortality. RESULTS: Adjusting for confounders, the slope of log F2-isoprostane levels across the percentage of days spent in the upper median of daily kilocalories per kilogram (kcal/kg) was 0.01 higher in the Risk Group than in the Nonrisk Group (P = 0.01). Every 1% increase in days spent in the upper median of daily kcal/kg was associated with an adjusted 10.3% increase in the odds of ICU mortality among participants in the Risk Group (odds ratio [OR] = 1.103, P = 0.06) but was not statistically significant in the Nonrisk Group (OR = 0.991, P = 0.79). CONCLUSION: Nutrition support may lead to increased oxidative stress and worse clinical outcomes in a large percentage of ICU patients with an at-risk genotype.


Enteral Nutrition , Superoxide Dismutase , Genotype , Glutathione Peroxidase , Humans , Oxidative Stress , Superoxide Dismutase/genetics , Glutathione Peroxidase GPX1
6.
J Card Surg ; 36(2): 743-747, 2021 Feb.
Article En | MEDLINE | ID: mdl-33350513

Granulomatosis with polyangiitis (GPA, also known as Wegener's granulomatosis) is a type of systemic vasculitis that primarily involves the lungs and kidneys. Diffuse alveolar hemorrhage (DAH) and associated acute respiratory failure are uncommon but devastating complications of GPA. Experience in using extracorporeal membrane oxygenation (ECMO) to manage DAH caused by GPA is limited. We report two GPA patients with DAH who were successfully managed with ECMO support. Examining 13 cases identified in the literature and two of our own, we observed that most patients experienced rapid deterioration in respiratory function in conjunction with a preceding respiratory infection. All 15 patients received veno-venous ECMO support. The median duration of ECMO support was 11 days (interquartile range: 7.5-20.75 days). Bleeding was the most common complication, seen in four (26.7%) cases. All patients were successfully weaned off ECMO after a median length of hospital stay of 42 days (interquartile range: 30-78 days). We demonstrated that ECMO is a reasonable and effective support option in the management of GPA patients with DAH. The risk of bleeding is high but may be reduced by using a lower anticoagulation goal.


Extracorporeal Membrane Oxygenation , Granulomatosis with Polyangiitis , Lung Diseases , Respiratory Distress Syndrome , Granulomatosis with Polyangiitis/complications , Granulomatosis with Polyangiitis/therapy , Hemorrhage/etiology , Hemorrhage/therapy , Humans , Lung Diseases/etiology , Lung Diseases/therapy
7.
JPEN J Parenter Enteral Nutr ; 44(8): 1484-1491, 2020 11.
Article En | MEDLINE | ID: mdl-31995239

BACKGROUND: Prospective randomized controlled trials (PRCTs) that found harm in patients receiving higher levels of energy exposure have been largely ignored, in part because of the lack of a known mechanism of harm. OBJECTIVE: The current 7-day pilot study is a PRCT and post hoc analysis designed to explore the relationship between energy exposure and oxidative stress (as plasma total F2-isoprostanes) in mechanically ventilated intensive care unit patients with systemic inflammatory response syndrome. METHODS: Thirty-five participants were randomized to receive either 100% or 40% of their estimated energy needs. Our intent-to-treat model found no differences in F2-isoprostanes between groups. A post hoc analysis revealed that on days when participants were in the highest tertile of daily kcal/kg, the real-time energy flow rate within 2 hours of the blood draw was predictive of increased oxidative stress. On these days, participants in the second or third vs the first tertile of real-time energy flow rate experienced a 41.8% (P = .006) or 26.5% (P = .001) increase in F2-isoprostane levels, respectively. This was confirmed through a within-group subanalysis restricted to participants with measurements on both sides of the median of real-time energy flow rate that found a 28.2% F2-isoprostane increase on days in the upper vs lower median of flow rate (P = .002). CONCLUSION: The benefits of feeding may be more nuanced than previously suspected. Our findings imply a potential mechanism of harm in meeting the current recommendations for nutrition support in the critically ill that warrants further investigation.


Critical Illness , Enteral Nutrition , Humans , Oxidative Stress , Pilot Projects , Prospective Studies
8.
Am J Med Qual ; 35(3): 222-230, 2020.
Article En | MEDLINE | ID: mdl-31253048

In the United States, hospital rating system usefulness is limited by heterogeneity and conflicting results. US News Best Hospitals, Vizient Quality and Accountability Study, Centers for Medicare & Medicaid Services (CMS) Star Rating, Leapfrog Hospital Safety Grade, and the Truven Top 100 Hospitals ratings were compared using Spearman correlations. Rank aggregation was used to combine the scores generating a Quality Composite Rank (QCR). The highest correlation between rating systems was shown between the Leapfrog Safety Grade and the CMS Star Rating. In a proportional odds logistic regression, a greater discordance between the CMS Star Rating, Vizient rank, US News, and Leapfrog was associated with a lower overall rank in the QCR. Lack of transparency and understanding about the differences and similarities for these hospital ranking systems complicates use of the measures. By combining the results of these ranking systems into a composite, the measurement of hospital quality can be simplified.
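Rank aggregation of this kind can be sketched simply: rank hospitals within each rating system, then order them by mean rank. The following toy example (hospital scores are invented, and the study's actual aggregation algorithm may differ) illustrates the idea behind a composite rank:

```python
# Toy rank aggregation across rating systems: rank hospitals within each
# system, then average the ranks to form a composite (lower = better).
# System names and scores are invented for illustration only.

def ranks(scores):
    """Map each score to its 1-based rank; higher score = better = rank 1."""
    ordered = sorted(scores, reverse=True)
    return [ordered.index(s) + 1 for s in scores]

systems = {
    "SystemA": [90, 70, 80],   # scores for hospitals H1, H2, H3
    "SystemB": [85, 88, 60],
    "SystemC": [75, 95, 65],
}

per_system_ranks = [ranks(s) for s in systems.values()]
composite = [sum(r) / len(per_system_ranks) for r in zip(*per_system_ranks)]
print(composite)  # H1 and H2 tie near 1.67; H3 trails near 2.67
```

Real aggregation methods must also handle ties and missing ratings, which this toy version ignores.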


Hospitals/standards , Quality of Health Care/organization & administration , Benchmarking/methods , Benchmarking/standards , Centers for Medicare and Medicaid Services, U.S. , Humans , Quality Indicators, Health Care , Quality of Health Care/standards , United States
9.
J Clin Endocrinol Metab ; 105(2)2020 02 01.
Article En | MEDLINE | ID: mdl-31581295

INTRODUCTION: The inflammatory response of critical illness is accompanied by nonthyroidal illness syndrome (NTIS). Feeding has been shown to attenuate this process, but this has not been explored prospectively over time in critically ill patients. OBJECTIVE: To explore the impact of calorie exposure on NTIS over time in critically ill patients. METHODS: Mechanically ventilated patients with systemic inflammatory response syndrome (SIRS) were randomized to receive either 100% or 40% of their estimated caloric needs (ECN). Thyroid hormones were measured daily for 7 days or until intensive care unit discharge or death. Mixed-level regression modeling was used to explore the effect of randomization group on plasma triiodothyronine (T3), reverse triiodothyronine (rT3), thyroxine (T4), and thyroid stimulating hormone (TSH), as well as the T3/rT3 ratio. RESULTS: Thirty-five participants (n = 19 in the 100% ECN group; n = 16 in the 40% ECN group) were recruited. Adjusting for group differences in baseline T3/rT3 ratio, the parameters defining the fitted curves (intercept, linear effect of study day, and quadratic effect of study day) differed by randomization group (P = 0.001, P = 0.01, and P = 0.02, respectively). Plots of the fitted curves revealed that participants in the 100% ECN group had a 54% higher T3/rT3 ratio on postintervention day 1 than the 40% ECN group, a difference that attenuated over time. This was driven by a 23% higher plasma T3 level and a 10% lower plasma rT3 level on postintervention day 1. CONCLUSIONS: Higher caloric exposure in NTIS patients transiently attenuates the drop in the plasma T3/rT3 ratio, an effect that is minimized and finally lost over the following 3 days of continued higher caloric exposure.


Energy Intake/physiology , Enteral Nutrition/methods , Euthyroid Sick Syndromes/blood , Euthyroid Sick Syndromes/therapy , Critical Illness/therapy , Female , Humans , Male , Middle Aged , Regression Analysis , Respiration, Artificial , Thyrotropin/blood , Thyroxine/blood , Treatment Outcome , Triiodothyronine/blood , Triiodothyronine, Reverse/blood
10.
J Heart Lung Transplant ; 38(4): 433-439, 2019 04.
Article En | MEDLINE | ID: mdl-30744940

BACKGROUND: Bleeding and need for blood products are major complications associated with extracorporeal membrane oxygenation (ECMO) use. This study evaluated bleeding complications with low and standard heparinization protocols in the maintenance of venoarterial (VA)-ECMO. METHODS: A retrospective comparison was performed of 2 methods of heparinization in a contemporaneous series of adult patients supported with VA-ECMO at Rush University Medical Center, between November 2011 and November 2016. CentriMag (Thoratec, Pleasanton, CA) pumps, Quadrox (Maquet, Wayne, NJ) oxygenators, and heparin-bonded circuitry were used in all patients. Group 1 was a control group of 50 patients who had ECMO support with an initiation dose of 5,000 U of heparin, followed by standard heparinization at a goal activated clotting time of 180 to 220 seconds. Group 2 comprised 52 adult patients supported with a "low heparin protocol" ECMO, receiving a standard heparin bolus of 5,000 U for cannulation but without subsequent, ongoing heparin administration. Acuity of illness was similar in both groups as assessed by the Mortality Probability Model (59% in Group 1 vs 62.9% in Group 2, p = 0.08). Data were submitted to the Extracorporeal Life Support Organization prospectively. Clots in the circuit, limb ischemia, oxygenator failure, and embolic complications were recorded. RESULTS: Weaning off ECMO was successful in 26 patients (50%) in Group 2 compared with 18 (36%) in Group 1 (p = 0.05). Hemorrhage from the cannulation site occurred in 11 (21%) in Group 2 vs 21 (42%) in Group 1 and from the surgical site in 11 (21%) in Group 2 vs 18 (36%) in Group 1. Severe bleeding complications were higher in the control group (Group 1, 32%) compared with Group 2 (11.5%; p = 0.012). CONCLUSIONS: Maintenance with low heparin is safe in patients supported by VA-ECMO. This strategy may reduce risk of severe bleeding and associated complications.


Anticoagulants/administration & dosage , Extracorporeal Membrane Oxygenation/adverse effects , Hemorrhage/etiology , Heparin/administration & dosage , Adult , Aged , Aged, 80 and over , Arteries , Extracorporeal Membrane Oxygenation/methods , Female , Humans , Male , Middle Aged , Retrospective Studies , Veins , Young Adult
11.
ASAIO J ; 65(1): 43-48, 2019 01.
Article En | MEDLINE | ID: mdl-29240627

Extracorporeal membrane oxygenation (ECMO) offers an opportunity for patient recovery through complete cardiopulmonary support but is associated with complications that limit duration and overall utility. We examine the role of ECMO as a potential bridge to high-risk cardiac surgery in otherwise inoperable cases. This study reports a retrospective, multi-institution experience examining all patients for whom ECMO was used preoperatively as a bridge to definitive cardiac surgery without exception. A consecutive patient database (December 2011 through August 2017) was utilized. European System for Cardiac Risk Evaluation (EuroSCORE) 2 was calculated as a metric of patient acuity and risk assessment. Observed and expected mortality were compared. Twelve adult patients fit inclusion criteria and were supported with ECMO during the study period. There were five males and seven females. Average age was 56 (39-77) years. All 12 patients were supported with venoarterial ECMO for cardiogenic shock. This was done in preparation for corrective conventional cardiac surgery. Definitive cardiac surgical procedures included complex valve (n = 5), left ventricular assist device (n = 3), coronary artery bypass grafting (CABG; n = 2), CABG/ventricular septal defect repair (n = 1), and mitral valve replacement/CABG (n = 1). Average time of ECMO support was 200 (range 113-379) hours. Three patients were decannulated from ECMO at the conclusion of definitive cardiac surgery. Risk assessed by Logistic EuroSCORE 2 ranged from 64% to 89%. Average EuroSCORE 2-predicted mortality representing all 12 patients was 77%. Thirty day mortality was 25% (3/12), and hospital mortality was 33% (4/12). Seven patients are still alive today, with a mean survival of 37 (range 2-64) months. Two deaths were associated with gastrointestinal bleeding and two with evolving liver failure. 
The mean difference between the EuroSCORE 2-predicted and the observed 30-day mortality rate was 42.33 percentage points (95% CI, 36.86-47.98; two-tailed, one-sample t test, p < 0.001). ECMO can successfully be utilized as a bridge to conventional cardiac surgical procedures in critically ill patients with historically high predicted mortality.


Extracorporeal Membrane Oxygenation/methods , Shock, Cardiogenic/therapy , Adult , Aged , Cardiac Surgical Procedures/methods , Extracorporeal Membrane Oxygenation/mortality , Female , Hospital Mortality , Humans , Male , Middle Aged , Retrospective Studies , Risk Assessment , Shock, Cardiogenic/mortality
12.
Crit Care Med ; 47(1): 69-75, 2019 01.
Article En | MEDLINE | ID: mdl-30303837

OBJECTIVES: Among critically ill patients, the benefits of nutrition support may vary depending on the severity of organ dysfunction. The objective of the current article was to explore the relationship of organ failure and calorie exposure with hospital mortality during the first week of acute respiratory distress syndrome. DESIGN: Retrospective observational study. SETTING: Single-center ICU. PATIENTS: Adults admitted to the ICU with a diagnosis of acute respiratory distress syndrome. INTERVENTIONS: Calorie delivery from enteral nutrition, parenteral nutrition, propofol, and dextrose-containing fluids was collected for 7 days following intubation. The Sequential Organ Failure Assessment score was calculated at ICU admission and for the same 7 days to describe organ dysfunction; four Sequential Organ Failure Assessment variables were created: (1) Sequential Organ Failure Assessment at ICU admission, (2) average Sequential Organ Failure Assessment for the first 7 days following intubation, (3) highest Sequential Organ Failure Assessment for the first 7 days following intubation, and (4) change in Sequential Organ Failure Assessment from intubation to 7 days later. MEASUREMENTS AND MAIN RESULTS: A total of 298 patients were included. All four Sequential Organ Failure Assessment variables, as well as calorie delivery during the first 7 days following intubation, were associated with an increased likelihood of mortality. Compared with patients with low organ failure and low calorie delivery, those with high calorie delivery and low organ failure, low calorie delivery and high organ failure, and the combination of high organ failure with high calorie delivery showed an incremental increase in the likelihood of mortality.
CONCLUSIONS: Organ failure appears to modify the relationship between calorie exposure and ICU outcome. Additional research is needed to identify appropriate thresholds for safe calorie exposure with increased organ failure.


Energy Intake , Intensive Care Units , Organ Dysfunction Scores , Respiratory Distress Syndrome/mortality , APACHE , Enteral Nutrition , Female , Humans , Male , Middle Aged , Parenteral Nutrition , Respiration, Artificial , Respiratory Distress Syndrome/therapy , Retrospective Studies
13.
JPEN J Parenter Enteral Nutr ; 43(1): 10-12, 2019 01.
Article En | MEDLINE | ID: mdl-30251356

We thank Drs. Patel, Martindale, and Heyland [1] for their interest in our most recent study [2] exploring the role of early exposure to recommended energy delivery in critically ill patients with acute respiratory distress syndrome. Their stated concerns center around 3 major areas: (1) the comparability of the population in our study [2] with those of INTACT (Intensive Nutrition in Acute Lung Injury: A Clinical Trial); (2) their confusion about the analysis of the relationship between the likelihood of death and energy exposure (Table 1 in our article [2]); and (3) their apparent misunderstanding of type I error, type II error, P-values, and power analysis. Our responses to each of these areas are detailed as follows.


Acute Lung Injury , Respiratory Distress Syndrome , Energy Intake , Humans , Intensive Care Units , Nutritional Status
14.
JPEN J Parenter Enteral Nutr ; 42(4): 739-747, 2018 May.
Article En | MEDLINE | ID: mdl-28662370

BACKGROUND: The Intensive Nutrition in Acute Lung Injury: Clinical Trial (INTACT), designed to evaluate outcomes of calorie delivery from acute respiratory distress syndrome (ARDS) diagnosis through hospital discharge, was stopped due to higher mortality in the intervention group. Post hoc analysis found timing and dose of calorie delivery influenced mortality. The objective of this retrospective cohort study was to determine if early vs late calorie exposure changed the hazard of death among a larger sample of patients with ARDS. METHODS: Adult patients who met the eligibility criteria for INTACT but did not participate were included. Daily calorie delivery was collected from the date INTACT eligibility was determined to extubation or death. Cox proportional hazards regression was used to model the relationship between hazard of hospital death with average calorie exposure received over increasing study days and after day 7. RESULTS: A total of 298 patients were included; overall mortality was 33%. Among patients who remained intubated at 1 week (n = 202), higher kcal/kg received from intensive care unit (ICU) days 1-6 increased hazards of subsequent death on days 7+ (hazard ratio [HR], 1.04; 95% confidence interval [CI], 1.01-1.06); kcal/kg received after ICU day 7 decreased the hazards of death on day 7+ (HR, 0.53; 95% CI, 0.33-0.84). CONCLUSIONS: Higher calorie exposure between ICU days 1 and 7 was associated with higher subsequent hazard of mortality, and provision of high-calorie exposure after day 8 decreased the hazards of death.


Acute Lung Injury/therapy , Critical Care/methods , Energy Intake , Intensive Care Units , Nutritional Support/adverse effects , Respiratory Distress Syndrome/therapy , Acute Lung Injury/mortality , Adult , Aged , Case-Control Studies , Female , Hospital Mortality , Humans , Male , Middle Aged , Nutritional Status , Nutritional Support/methods , Proportional Hazards Models , Respiratory Distress Syndrome/mortality , Retrospective Studies
15.
Am J Clin Nutr ; 105(2): 411-416, 2017 02.
Article En | MEDLINE | ID: mdl-27974311

BACKGROUND: Our trial INTACT (Intensive Nutrition in Acute Lung Injury Trial) was designed to compare the impact of feeding from acute lung injury (ALI) diagnosis to hospital discharge, an interval that, to our knowledge, has not yet been explored. It was stopped early because participants who were randomly assigned to energy intakes at nationally recommended amounts via intensive medical nutrition therapy experienced significantly higher mortality hazards than did those assigned to standard nutrition support care that provided energy at 55% of recommended amounts. OBJECTIVE: We assessed the influence of the dose and timing of feeding on hospital mortality. DESIGN: Participants (n = 78) were dichotomized as died or discharged alive. Associations between the energy and protein received overall, early (days 1-7), and late (days ≥8) and the hazards of hospital mortality were evaluated between groups with multivariable analysis methods. RESULTS: Higher overall energy intake predicted significantly higher mortality (OR: 1.14, 95% CI: 1.02, 1.27). Among participants enrolled for ≥8 d (n = 66), higher early energy intake significantly increased the hazard of mortality (HR: 1.17, 95% CI: 1.07, 1.28), whereas higher late energy intake was significantly protective (HR: 0.91, 95% CI: 0.83, 1.0). Results were similar for early but not late protein (grams per kilogram) exposure (early-exposure HR: 8.9, 95% CI: 2.3, 34.3; late-exposure HR: 0.15, 95% CI: 0.02, 1.1). Threshold analyses indicated that early mean intakes ≥18 kcal/kg significantly increased subsequent mortality. CONCLUSIONS: Providing energy (kcal/kg) or protein (g/kg) early after ALI diagnosis at recommended levels was associated with significantly higher hazards of mortality, whereas higher late energy intake reduced mortality hazards.
This time-varying effect violated the Cox proportionality assumption, indicating that feeding trials in similar populations should extend beyond 7 d and use time-varying statistical methods. Future trials are required for corroboration. INTACT was registered at clinicaltrials.gov as NCT01921101.


Acute Lung Injury/mortality , Energy Intake , Nutritional Support , Adult , Aged , Cross-Over Studies , Dietary Proteins/administration & dosage , Female , Humans , Male , Middle Aged , Nutritional Status , Patient Discharge , Proportional Hazards Models
16.
Jt Comm J Qual Patient Saf ; 42(10): 439-446, 2016.
Article En | MEDLINE | ID: mdl-27712602

BACKGROUND: Differences between the Centers for Medicare & Medicaid Services (CMS)-measured rates of safety events for Rush University Medical Center (RUMC; Chicago) and the U.S. News & World Report (USNWR)-determined patient safety score were evaluated in an attempt to validate the USNWR patient safety score-based ranking. METHODS: The USNWR findings for Patient Safety Indicators (PSIs) were compared with findings derived from RUMC internal billing data, and sensitivity analyses were conducted using a simulated data set derived from the Healthcare Cost and Utilization Project (HCUP) state inpatient data sets. RESULTS: Discrepancies were found for PSIs 3 (Pressure Ulcer Rate), 9 (Perioperative Hemorrhage or Hematoma Rate), and 11 (Postoperative Respiratory Failure Rate): an excess of 0.72, 0.63, and 0.26 cases/1,000 admissions, respectively, in USNWR versus RUMC. The sensitivity analysis, which included missing present on admission (POA) flags and dates, resulted in an increase of rates by 1.83 (95% confidence interval [CI] = 1.10-2.56), 2.72 (CI = 0.00-5.90), and 3.89 (CI = 1.60-6.20) cases/1,000 hospitalizations for PSIs 3, 9, and 11, respectively. Regression modeling showed that each 1% increase in transfers was associated with an increase of 0.06 cases of PSI 3/1,000 admissions; each 1,000 increase in admissions was associated with an increase of 0.04 cases of PSI 9/1,000 admissions. CONCLUSION: The USNWR data set produced inaccurate PSI rates for RUMC, and false-positive event rates were more common among high-transfer and high-volume hospitals. More transparency and validation are needed for consumer-based benchmarking methods. In response to these findings and concerns raised by others, in 2016 USNWR made changes to its methodology and data sources and reported them in announcing its 2016-17 Best Hospitals.


Hospitals/standards , Patient Safety/standards , Quality Indicators, Health Care , Centers for Medicare and Medicaid Services, U.S. , Chicago , Humans , United States
17.
Asian Cardiovasc Thorac Ann ; 24(6): 523-9, 2016 Jul.
Article En | MEDLINE | ID: mdl-27230516

BACKGROUND: The optimal timing of surgical treatment for infective endocarditis complicated by cerebrovascular events is controversial, largely due to the perceived risk of perioperative intracranial bleeding. Current guidelines suggest waiting 2 weeks between the diagnosis of stroke and surgery. The aim of this study was to investigate the clinical and neurological outcomes of early surgery following a stroke. METHODS: This was a single-center retrospective analysis of 12 consecutive patients requiring surgery for infective endocarditis between 2011 and 2014 at Rush University Medical Center, with ischemic (n = 6) and/or hemorrhagic (n = 6) cerebrovascular complications. All underwent computed tomographic angiography prior to early valve reconstructive surgery to identify potentially actionable neurological findings. Early valve surgery was performed for ongoing sepsis or persistent emboli. Neurologic risk and outcome were assessed pre- and postoperatively using the National Institutes of Health Stroke Scale and the Glasgow Outcome Scale, respectively. RESULTS: All 12 patients underwent surgical treatment within 10 days of the diagnosis of stroke. Mortality in the immediate postoperative period was 8%. Eleven of the 12 patients exhibited good neurological recovery in the immediate postoperative period, with a Glasgow Outcome Scale score ≥ 3. There was no correlation between duration of cardiopulmonary bypass and neurological outcomes. CONCLUSION: Early cardiac surgery in patients with infective endocarditis and stroke may be lifesaving with a low neurological risk. Comprehensive neurovascular imaging may help in identifying patient-related risk factors.


Cardiac Surgical Procedures , Endocarditis/surgery , Heart Valve Prosthesis Implantation , Heart Valves/surgery , Stroke/etiology , Time-to-Treatment , Adult , Aged , Aged, 80 and over , Cardiac Surgical Procedures/adverse effects , Cardiac Surgical Procedures/mortality , Cerebral Angiography/methods , Chicago , Computed Tomography Angiography , Endocarditis/complications , Endocarditis/diagnosis , Endocarditis/mortality , Female , Glasgow Outcome Scale , Heart Valve Prosthesis Implantation/adverse effects , Heart Valve Prosthesis Implantation/mortality , Heart Valves/diagnostic imaging , Humans , Magnetic Resonance Angiography , Male , Middle Aged , Neurologic Examination , Patient Selection , Retrospective Studies , Risk Factors , Stroke/diagnostic imaging , Stroke/mortality , Stroke/physiopathology , Time Factors , Treatment Outcome
19.
Clin Infect Dis ; 61(10): 1554-7, 2015 Nov 15.
Article En | MEDLINE | ID: mdl-26123936

The 2014-2015 Ebola virus disease (EVD) epidemic and international public health emergency has been referred to as a "black swan" event, or an event that is unlikely, hard to predict, and highly impactful once it occurs. The Chicago Ebola Response Network (CERN) was formed in response to EVD and is capable of receiving and managing new cases of EVD, while also laying the foundation for a public health network that can anticipate, manage, and prevent the next black swan public health event. By sharing expertise, risk, and resources among 4 major academic centers, Chicago created a sustainable network to respond to the latest in a series of public health emergencies. In this respect, CERN is a roadmap for how a region can prepare to respond to public health emergencies, thereby preventing negative impacts through planning and implementation.


Academic Medical Centers , Civil Defense/methods , Civil Defense/organization & administration , Epidemiologic Methods , Hemorrhagic Fever, Ebola/diagnosis , Hemorrhagic Fever, Ebola/prevention & control , Public Health/methods , Chicago , Hemorrhagic Fever, Ebola/transmission , Humans
...